Inner Product Laplacian Embedding Based on Semidefinite Programming
Author
Abstract
This paper proposes an inner product Laplacian embedding algorithm based on semidefinite programming, named the IPLE algorithm. The new algorithm learns a geodesic-distance-based kernel matrix by semidefinite programming under local contraction constraints. The criterion function pulls neighboring points on the manifold as close together as possible while preserving the geodesic distances between distant points. The IPLE algorithm thus integrates the advantages of the LE, ISOMAP and MVU algorithms. Comparison experiments with LE, ISOMAP, MVU and the proposed IPLE are performed on two image datasets, the COIL-20 images and the USPS handwritten digit images. Experimental results show that the intrinsic low-dimensional coordinates obtained by our algorithm preserve more information, as measured by the fraction of the dominant eigenvalues, and achieve better overall performance in clustering and in preserving the manifold structure.
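To make the described construction concrete, the following is a minimal Python/CVXPY sketch of an SDP of this kind, not the authors' exact formulation: the k-nearest-neighbor graph, the shortest-path geodesic distances, the contraction objective over neighbor pairs, the inequality form of the distance-preserving constraints, and the function name iple_sketch are all assumptions made for illustration.

import numpy as np
import cvxpy as cp
from scipy.sparse.csgraph import shortest_path
from sklearn.neighbors import kneighbors_graph

def iple_sketch(X, n_neighbors=5, n_components=2):
    # Hypothetical sketch: learn a centered Gram (inner-product) matrix K by SDP,
    # contracting neighborhoods while capping non-neighbor distances by their
    # geodesic distances, then embed via the dominant eigenpairs of K.
    n = X.shape[0]
    # k-NN graph and geodesic (shortest-path) distances, as in ISOMAP.
    W = kneighbors_graph(X, n_neighbors, mode="distance")
    D_geo = shortest_path(W, method="D", directed=False)
    neighbors = np.asarray((W + W.T).todense()) > 0

    K = cp.Variable((n, n), PSD=True)
    constraints = [cp.sum(K) == 0]  # center the embedding at the origin
    objective_terms = []
    for i in range(n):
        for j in range(i + 1, n):
            d2_ij = K[i, i] + K[j, j] - 2 * K[i, j]  # squared embedded distance
            if neighbors[i, j]:
                objective_terms.append(d2_ij)  # pull neighboring points together
            elif np.isfinite(D_geo[i, j]):
                constraints.append(d2_ij <= D_geo[i, j] ** 2)  # bound distant pairs by their geodesic distance
    problem = cp.Problem(cp.Minimize(cp.sum(cp.hstack(objective_terms))), constraints)
    problem.solve(solver=cp.SCS)

    # Low-dimensional coordinates from the top eigenpairs of the learned kernel.
    vals, vecs = np.linalg.eigh(K.value)
    order = np.argsort(vals)[::-1][:n_components]
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0.0))

As in MVU, the coordinates are read off from the dominant eigenpairs of the learned kernel, which relates to the fraction-of-dominant-eigenvalues measure mentioned above; because the sketch enumerates all point pairs, it is only practical for small datasets.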
Similar resources
A Combinatorial Algorithm for Minimizing the Maximum Laplacian Eigenvalue of Weighted Bipartite Graphs
We give a strongly polynomial time combinatorial algorithm to minimise the largest eigenvalue of the weighted Laplacian of a bipartite graph G = (W ∪ B, E). This is accomplished by solving the dual graph embedding problem which arises from a semidefinite programming formulation. In particular, the problem for trees can be solved in time O(|W ∪ B|).
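As a quick illustration of the quantity being minimized (not of the combinatorial algorithm itself or of its semidefinite dual), the short Python snippet below evaluates the largest eigenvalue of the weighted Laplacian L = D - W for a small bipartite graph; the edge weights are purely hypothetical.

import numpy as np

def largest_laplacian_eigenvalue(W):
    # Largest eigenvalue of L = D - W for a symmetric nonnegative weight matrix W.
    L = np.diag(W.sum(axis=1)) - W
    return np.linalg.eigvalsh(L)[-1]  # eigvalsh returns eigenvalues in ascending order

# Hypothetical bipartite graph: parts {0, 1} and {2, 3} with illustrative edge weights.
W = np.zeros((4, 4))
for i, j, w in [(0, 2, 1.0), (0, 3, 0.5), (1, 2, 2.0)]:
    W[i, j] = W[j, i] = w
print(largest_laplacian_eigenvalue(W))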
Extensions and Analysis of Local Non-linear Techniques
The techniques Conformal Eigenmap and Neighborhood Preserving Embedding (NPE) have been proposed as extensions of local non-linear techniques. Many of the commonly used non-linear dimensionality reduction techniques, such as Locally Linear Embedding (LLE) and the Laplacian eigenmap, are not explicitly designed to preserve local features such as distances or angles. In the first proposed technique, Conformal Eigenmap,...
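Since Laplacian eigenmaps recur throughout these references, it may help to note that a basic Laplacian eigenmap embedding can be computed with scikit-learn's SpectralEmbedding; the random data and parameter values below are purely illustrative.

import numpy as np
from sklearn.manifold import SpectralEmbedding

X = np.random.default_rng(0).normal(size=(200, 10))  # illustrative high-dimensional data
# SpectralEmbedding implements Laplacian eigenmaps over a nearest-neighbor affinity graph.
Y = SpectralEmbedding(n_components=2, affinity="nearest_neighbors", n_neighbors=10).fit_transform(X)
print(Y.shape)  # (200, 2): the low-dimensional coordinates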
Fast Graph Laplacian Regularized Kernel Learning via Semidefinite-Quadratic-Linear Programming
Kernel learning is a powerful framework for nonlinear data modeling. Using the kernel trick, a number of problems have been formulated as semidefinite programs (SDPs). These include Maximum Variance Unfolding (MVU) (Weinberger et al., 2004) in nonlinear dimensionality reduction, and Pairwise Constraint Propagation (PCP) (Li et al., 2008) in constrained clustering. Although in theory SDPs can be...
Enhanced Multilevel Manifold Learning
Two multilevel frameworks for manifold learning algorithms are discussed, both based on an affinity graph whose goal is to sketch the neighborhood of each sample point. One framework is geometric and is suitable for methods aiming to find an isometric or a conformal mapping, such as isometric feature mapping (Isomap) and semidefinite embedding (SDE). The other is algebraic and can be incorp...
A path following interior-point algorithm for semidefinite optimization problem based on new kernel function
In this paper, we obtain some new complexity results for solving the semidefinite optimization (SDO) problem by interior-point methods (IPMs). We define a new proximity function for the SDO problem via a new kernel function. Furthermore, we formulate an algorithm for a primal-dual interior-point method (IPM) for the SDO problem using this proximity function, give its complexity analysis, and then we sho...
Journal: J. Signal and Information Processing
Volume: 2, Issue: -
Pages: -
Year of publication: 2011